Adaptive Constraint Reduction for Convex Quadratic Programming and Training Support Vector Machines
Author
Abstract
Title of dissertation: Adaptive Constraint Reduction for Convex Quadratic Programming and Training Support Vector Machines
Jin Hyuk Jung, Doctor of Philosophy, 2008
Dissertation directed by: Professor Dianne P. O’Leary, Department of Computer Science

Convex quadratic programming (CQP) is the problem of minimizing a convex quadratic objective function subject to linear constraints. We propose an adaptive constraint-reduction primal-dual interior-point algorithm for CQP with many more constraints than variables. We reduce the computational effort by assembling the normal-equation matrix from a subset of the constraints: instead of the exact matrix, we compute an approximation over a well-chosen index set containing the constraints that appear most critical. Starting with a large portion of the constraints, our proposed scheme excludes more unnecessary constraints at later iterations. We prove global convergence and a quadratic local convergence rate for an affine-scaling variant; a similar approach can be applied to Mehrotra’s predictor-corrector type algorithms.

An example of CQP arises in training a linear support vector machine (SVM), a popular tool for pattern recognition. The difficulty in training an SVM lies in the typically vast number of patterns used for the training process. In this work, we propose an adaptive constraint-reduction primal-dual interior-point method for training the linear SVM with ℓ1 hinge loss. We reduce the computational effort by assembling the normal-equation matrix from a subset of well-chosen patterns. Starting with a large portion of the patterns, our proposed scheme excludes more and more unnecessary patterns as the iteration proceeds. We extend our approach to training nonlinear SVMs through Gram matrix approximation methods. Promising numerical results are reported.
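The core computational idea described above, assembling the normal-equation matrix from only a subset of the constraints, can be sketched as follows. This is a minimal illustration, not the dissertation's algorithm: the selection rule shown (keep the constraints with the largest diagonal weights, which in an interior-point method correspond to small slacks and hence likely-active constraints) is a simplified stand-in for the adaptive criterion, and all names are mine.

```python
import numpy as np

def reduced_normal_matrix(A, d, q):
    """Approximate the normal-equation matrix A^T D A using only the q
    constraints with the largest diagonal weights d_i (a hypothetical
    selection rule; the dissertation's adaptive criterion is more elaborate).

    A : (m, n) constraint matrix, m >> n
    d : (m,) positive diagonal weights from the current interior-point iterate
    q : number of constraints to keep, q >= n
    """
    Q = np.argsort(d)[-q:]            # indices of the q most "critical" constraints
    AQ, dQ = A[Q], d[Q]
    return AQ.T @ (dQ[:, None] * AQ)  # A_Q^T D_Q A_Q, an (n, n) matrix

# Example: with m = 1000 constraints and n = 5 variables, keeping q = 50
# constraints cuts the assembly cost from O(m n^2) to O(q n^2).
rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 5))
d = rng.random(1000) + 1e-3
M = reduced_normal_matrix(A, d, 50)
```

As long as the retained rows of `A` have full column rank and the weights are positive, the reduced matrix stays symmetric positive definite, so the interior-point step can still be computed from it.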
Adaptive Constraint Reduction for Convex Quadratic Programming and Training Support Vector Machines
by Jin Hyuk Jung
Dissertation submitted to the Faculty of the Graduate School of the University of Maryland, College Park, in partial fulfillment of the requirements for the degree of Doctor of Philosophy, 2008
Advisory Committee: Professor Dianne P. O’Leary, Chair/Advisor; Professor Kyu Yong Choi; Professor Hanan Samet; Professor G. W. Stewart; Professor André L. Tits
© Copyright by Jin Hyuk Jung 2008
Similar resources
Adaptive Constraint Reduction for Training Support Vector Machines
A support vector machine (SVM) determines whether a given observed pattern lies in a particular class. The decision is based on prior training of the SVM on a set of patterns with known classification, and training is achieved by solving a convex quadratic programming problem. Since there are typically a large number of training patterns, this can be expensive. In this work, we propose an adapt...
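The convex quadratic program mentioned above is, in the standard soft-margin formulation with ℓ1 hinge loss (notation mine, not copied from the dissertation):

```latex
\min_{w,\,\gamma,\,\xi}\; \tfrac{1}{2}\|w\|_2^2 + C \sum_{i=1}^{m} \xi_i
\quad \text{s.t.} \quad
y_i\,(w^{\top} x_i - \gamma) \ge 1 - \xi_i,\qquad \xi_i \ge 0,\quad i = 1,\dots,m,
```

where each $(x_i, y_i)$ is a training pattern with label $y_i \in \{-1, +1\}$ and $C > 0$ trades margin width against misclassification. Each training pattern contributes one inequality constraint, which is why a large training set makes the QP expensive and why constraint reduction applies.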
A Polynomial Time Constraint-Reduced Algorithm for Semidefinite Optimization Problems
We propose a new infeasible primal-dual predictor-corrector interior-point method (IPM) with adaptive criteria for constraint reduction. The algorithm is a modification of one with no constraint reduction, due to Potra and Sheng (2006). Our algorithm can be applied when the data matrices are block diagonal. It thus solves as special cases any optimization problem that is linear, convex quadrati...
A parallel solver for large quadratic programs in training support vector machines
This work is concerned with the solution of the convex quadratic programming problem arising in training the learning machines named support vector machines. The problem is subject to box constraints and to a single linear equality constraint; it is dense and, for many practical applications, it becomes a large-scale problem. Thus, approaches based on explicit storage of the matrix of the quadr...
A tutorial on support vector regression
In this tutorial we give an overview of the basic ideas underlying Support Vector (SV) machines for function estimation. Furthermore, we include a summary of currently used algorithms for training SV machines, covering both the quadratic (or convex) programming part and advanced methods for dealing with large datasets. Finally, we mention some modifications and extensions that have been applied...
A parallel training algorithm for large scale support vector machines
Support vector machines (SVMs) are an extremely successful class of classification and regression algorithms. Building an SVM entails the solution of a constrained convex quadratic programming problem which is quadratic in the number of training samples. Previous parallel implementations of SVM solvers sequentially solved subsets of the complete problem, which is problematic when the solution r...